Entropy is a measure of the amount of energy not available for useful work. Imagine a battery lighting a light bulb: the battery has a certain amount of energy, and its ability to do work with that energy is determined by the tendency of electrons at one end of the battery to move to the other end. In this state the battery has a relatively low entropy, since it has a good ability to do work. If you let the battery light the bulb for a while, it loses energy available for work (energy is transformed into light and heat as it passes through the bulb, and only the remaining energy makes it to the other end of the battery). The battery has now lost energy, gained entropy, and moved closer to equilibrium. Eventually the battery's entropy will climb to a maximum: it will no longer have any energy available for useful work, and it will sit at equilibrium, in the same way that a room with pressurized air in one half and normal air in the other will quickly reach a state where the entire room is at the same pressure. In a nutshell, entropy describes the constant tendency of all the energy in the universe (and, if you remember E=mc^2, all mass too) to come to a state where there is no longer any energy available for work and the universe is at a perfect equilibrium. [1][2]
Image courtesy of anto-hendarto.blogspot.com
The word entropy was modeled on the word energy, since the two are so closely tied. It comes from the Greek en- ‘in’ and tropḗ ‘turning, transformation’. If you look at how energy works, you will see that it comes in a variety of forms: electrical, nuclear, chemical, heat, and so on. These forms transform readily from one to another; for example, radiant energy from the sun turns into heat energy once it contacts your skin. When this energy is transformed it doesn't all go straight to heat: some of it remains radiant and bounces off your skin (which is why people can see you), alongside the part that is absorbed as heat. In a sense, the quality of this energy has diminished, since it is no longer in a single concentrated form. How does this pertain to entropy? Since energy that was once concentrated, and had an ability to do useful work, has spread out into different places and forms, the ability to get work out of the remaining energy has diminished. Therefore the entropy (the energy not available for work) of this system has increased. Apply this to an engine and you can see further why it is impossible to build a perpetual-motion machine (one that never loses usable energy) in the real world. As a machine runs it loses some of its energy, and no matter how much of that energy it tries to recycle and reuse, it can never reuse all of the energy it used in the first place. Furthermore, the second law of thermodynamics tells us that the energy will be transformed and lose its quality through heat, radiation, sound, and so on, so the entropy of the machine, and with it its inability to perform work, will always increase.[3][4][5]
History of Entropy
Entropy was brought to life in the second law of thermodynamics, which was introduced by Rudolf Clausius and William Thomson in 1850. Clausius simplified these laws and coined the term entropy, which he represented with S, in 1865. The simplified laws are as follows: the energy of the universe is constant, and the entropy of the universe tends toward a maximum. By this time the entropy change was defined as ΔS = Q/T, where Q is the heat added to the system and T is the temperature of the system. Later, in 1877, Ludwig Boltzmann developed a definition of entropy that was purely based in statistics. This definition, S = k log W, was Boltzmann's crowning achievement; he was so proud of it that it is engraved on his tombstone. The equation says that entropy equals the Boltzmann constant, k, times the logarithm of W, the number of states in which a system can exist. [6][7][8][9]
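As a small numerical sketch (the function names here are mine, not from any source), both definitions can be computed directly; note that the Boltzmann form is conventionally written with the natural logarithm:

```python
from math import log

# Boltzmann constant, in joules per kelvin
K_B = 1.380649e-23

def clausius_entropy_change(heat_joules, temp_kelvin):
    """Clausius form: dS = Q / T, for heat Q added at temperature T."""
    return heat_joules / temp_kelvin

def boltzmann_entropy(num_microstates):
    """Boltzmann form: S = k * ln(W), where W counts accessible microstates."""
    return K_B * log(num_microstates)

# Adding 100 J of heat to a system held at 300 K:
print(clausius_entropy_change(100.0, 300.0))  # about 0.333 J/K

# A system with a million accessible microstates:
print(boltzmann_entropy(1_000_000))
```

A system with exactly one possible state (W = 1) gives S = 0, which matches the intuition that a perfectly ordered system has no entropy.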
How Does Entropy Work?
Entropy can also be defined as a measure of the dispersal of energy: how much the energy has spread out. Energy, like everything else in the universe, tends toward equilibrium; it flows from areas of high energy to areas of low energy. You can observe this with a hot cup of tea. While the energy in the tea, in the form of heat, is concentrated within the cup, the entropy is relatively low: the hot cup has plenty of energy available to do work, like burning your tongue or, if you're adventurous, powering a miniature steam turbine. Over time the heat of the water dissipates into the surroundings: the air, the cup, and any unfortunate soul who burns his lips on it. Since the energy is leaving the water, the cup cannot do as much work as it could before, and once the cup has gone cold it has very little ability to do work. Its entropy is now larger than when it started, because it has dispersed a good portion of its energy and has less ability to do useful work.

Entropy is a tricky concept to wrap your head around. The basic principle of entropy, and of the second law of thermodynamics, is that the net change in entropy is always >= 0. To start understanding entropy, it helps to think in terms of disorder on the microscopic scale. If we interpret Boltzmann's equation for a system at equilibrium, S = k log W, we see that entropy increases as organization is lost. W in the equation represents the number of possible states in which a system can exist. If we were to find the entropy of a roll of two six-sided dice, we would have to count the total number of states the roll can exist in, i.e., every possible outcome. The count depends on every combination of numbers and on which die shows which number: if die1 is 3 and die2 is 4, that is a different state than die1 being 4 and die2 being 3.
If you imagine the entropy of a gas, it reflects the total number of possible states the gas can exist in on a microscopic scale. By microscopic I mean that while the gas sits in a jar, the molecules inside that jar can be arranged in a vast number of ways that all look the same on a macroscopic scale. So, to find the entropy of the gas or the dice, you count the total number of states and plug it into Boltzmann's equation. [10][11][12][13]
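The dice count described above can be checked by brute force (a toy illustration, not a physical entropy; variable names are mine):

```python
from itertools import product
from math import log

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Enumerate every ordered outcome of two six-sided dice. Because
# (die1=3, die2=4) is a distinct state from (die1=4, die2=3),
# the total is W = 6 * 6 = 36.
states = list(product(range(1, 7), repeat=2))
W = len(states)
print(W)  # 36

# Plugging W into Boltzmann's equation, S = k * ln(W):
S = K_B * log(W)
print(S)
```

The same counting idea scales (in principle) to the gas in the jar: each distinct microscopic arrangement of the molecules is one state, and W is astronomically large.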
Entropy, Thermodynamics and the Universe
Image courtesy of cartoonstock.com
Before discussing the second law of thermodynamics, I will touch briefly on the first. The first law of thermodynamics states that energy can be neither created nor destroyed; it is conserved in every interaction and transformation. This means, for example, that if you have to brake suddenly while driving, the energy of the moving car is not destroyed but converted into other forms: heat in the brake pads, the sound of the braking, heat on the road, and an ever-so-slight movement of the earth. The second law of thermodynamics states that the net entropy of the universe increases in every interaction. A process that ends with the total entropy of the system and its surroundings greater than it began (more disorder) is possible; a process that ends with less total entropy (less disorder) is not. For example, if you have a sealed jar of gas (a system), you can open the jar and let the gas move into the environment. But you cannot hold an open, empty jar and expect the gas to move from the environment back into the jar; that never happens on its own. Note that the amount of gas is constant no matter how large an area it fills, the jar or the entire room. In terms of entropy, the gas that occupied the jar dispersed into the room, decreasing its concentration in the jar while spreading through the room (or even the universe). Disorder always increases once you take into account both the system (the jar) and its environment (the room). As the second law says, with every passing second and every action, the net entropy of the universe is increasing.
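The jar example can be made quantitative with the standard free-expansion formula for an ideal gas, ΔS = nR ln(V2/V1) (a textbook result beyond the text above; the function name is mine):

```python
from math import log

R = 8.314  # ideal gas constant, J/(mol*K)

def free_expansion_entropy(n_moles, v_initial, v_final):
    """Entropy change when an ideal gas spreads freely from volume
    v_initial to v_final: dS = n * R * ln(V2 / V1). It is positive
    whenever the gas expands, matching the second law."""
    return n_moles * R * log(v_final / v_initial)

# One mole of gas escaping a 1 L jar into a 1000 L room:
print(free_expansion_entropy(1.0, 1.0, 1000.0))  # about +57.4 J/K

# The reverse (gas crowding itself back into the jar) would give a
# negative dS for an isolated system, which the second law forbids:
print(free_expansion_entropy(1.0, 1000.0, 1.0))  # about -57.4 J/K
```

The sign alone tells the story: spreading out is allowed, spontaneous un-spreading is not.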
This relates to the theory of the expansion of the universe: as the universe expands, all of the available energy is being spread out and dispersed.[14][15][16]
References
http://www.word-origins.com/definition/entropy.html
http://www.nuc.berkeley.edu/courses/classes/E-115/Slides/A_Brief_History_of_Thermodynamics.pdf
http://scienceworld.wolfram.com/physics/Entropy.html
http://www.entropylaw.com/entropyenergy.html